Introduction to Numerical Analysis 2

Lecture 7

Gaussian elimination algorithm

Now we can solve any linear system Ax = b with a regular (invertible) matrix A and any right-hand side b.

But to use this algorithm in practice, we also need to know how much time it will take for larger linear systems.

Question. How much “work” does the algorithm need to perform to solve the linear system for a given n?

By work we mean the simple operations that a computer can perform, such as adding two numbers, multiplying two numbers, and so on. Modern computers can perform about 10^9 to 10^{12} such operations per second. By estimating the number of operations the algorithm needs, we can therefore estimate the computation time.

Estimating the number of operations

We do not need to count all the operations exactly; a simple estimate is enough.

Most of the operations are performed in the nested loops, so we focus on those.

Gaussian elimination has 3 nested loops:

do k = 1, n

    ! pivot searching, etc. ...

    do i = k + 1, n
        c = -a(i, k) / a(k, k)
        do j = k, n
            a(i, j) = a(i, j) + c * a(k, j)
        end do
    end do

    ! operations on b
end do
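For completeness, the fragment above can be fleshed out into a self-contained routine. The following is a sketch, not our exact implementation: the subroutine name `eliminate`, the partial-pivoting strategy, and the handling of `b` are filled in here as plausible choices.

```fortran
! A sketch of Gaussian elimination with partial pivoting; the subroutine
! name and the treatment of b are illustrative assumptions.
subroutine eliminate(n, a, b)
    implicit none
    integer, intent(in) :: n
    real(8), intent(inout) :: a(n, n), b(n)
    integer :: i, j, k, p
    real(8) :: c, tb, tmp(n)

    do k = 1, n
        ! pivot search: find the row with the largest |a(i, k)| in column k
        p = k
        do i = k + 1, n
            if (abs(a(i, k)) > abs(a(p, k))) p = i
        end do
        if (p /= k) then
            ! swap rows k and p of a, and the corresponding entries of b
            tmp = a(k, :); a(k, :) = a(p, :); a(p, :) = tmp
            tb = b(k);     b(k) = b(p);      b(p) = tb
        end if

        ! eliminate the entries below the pivot
        do i = k + 1, n
            c = -a(i, k) / a(k, k)
            do j = k, n
                a(i, j) = a(i, j) + c * a(k, j)
            end do
            b(i) = b(i) + c * b(k)   ! the corresponding operation on b
        end do
    end do
end subroutine eliminate
```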

The command a(i, j) = a(i, j) + c * a(k, j) consists of 1 multiplication and 1 addition.

These 2 operations are performed for j = k, n, that is, n - k + 1 times, for a total of 2(n - k + 1) operations.

This inner loop is performed for i = k + 1, n, that is, n - k times, for a total of 2(n - k + 1)(n - k) operations.

Finally, the outer loop runs over k = 1, n, for a total of \sum_{k=1}^n 2 (n-k+1) (n-k) = \tfrac 23(n^3 - n) operations.
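The closed form follows from the substitution m = n - k and the standard formulas for \sum m and \sum m^2:

\sum_{k=1}^n 2(n-k+1)(n-k) = \sum_{m=0}^{n-1} 2(m+1)m = 2 \sum_{m=0}^{n-1} m^2 + 2 \sum_{m=0}^{n-1} m
= 2 \cdot \tfrac{(n-1)n(2n-1)}{6} + 2 \cdot \tfrac{(n-1)n}{2} = \tfrac 23 (n-1)n(n+1) = \tfrac 23 (n^3 - n).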

For large n, the dominant contribution is the leading term \tfrac 23 n^3, so the number of operations grows on the order of n^3.

  n       number of operations (≈ n^3)
  5       125
  10      1000
  100     10^6
  1000    10^9
We see that the number of required operations grows very quickly with n. Still, on a modern computer all of the cases above take about 1 second or less.
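These estimates can be reproduced with a short program. This is a minimal sketch, assuming a rate of 10^9 operations per second (the actual rate depends on the machine):

```fortran
! Rough cost estimate: ~ 2/3 n^3 operations, at an assumed rate of
! 10^9 operations per second (the rate itself is an assumption).
program estimate_time
    implicit none
    integer :: ns(4), i, n
    real(8) :: ops
    ns = (/ 5, 10, 100, 1000 /)

    do i = 1, 4
        n = ns(i)
        ops = (2.0d0 / 3.0d0) * real(n, 8)**3
        print '(a, i5, a, es9.2, a, es9.2, a)', &
            'n = ', n, ',  operations ~ ', ops, ',  time ~ ', ops / 1.0d9, ' s'
    end do
end program estimate_time
```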

However, in practice we may need to solve systems with n = 10^9 and more, for example in engineering simulations, computer graphics, weather prediction, or data processing. This would take on the order of 10^{27} operations…

One therefore needs faster algorithms in practice.

One way to speed up solving a linear system Ax = b is a modification of the Gaussian elimination called the LU decomposition.

A way to speed up Gaussian elimination

The Gaussian elimination algorithm that we implemented proceeds in two steps:

  1. Convert the system Ax = b into a system Ux = \tilde b with an upper triangular matrix U and a modified vector \tilde b. This performs elimination by adding multiples of rows of the matrix and swapping rows for pivoting.

  2. Solve the system Ux = \tilde b for x by backsubstitution.

The amount of work we need to perform in each step can be quite easily estimated:

  1. Finding U takes about (on the order of) n^3 operations as explained above. On the other hand, computing \tilde b from b takes only about n^2 operations.

  2. Solving Ux = \tilde b takes only about n^2 operations.
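Backsubstitution can be sketched as follows (the name `backsubst` and the argument layout are illustrative choices). The nested loops perform about n^2/2 multiplications and as many additions, which is where the n^2 estimate comes from:

```fortran
! Backsubstitution for an upper triangular system U x = btilde; a sketch,
! assuming all diagonal entries u(i, i) are nonzero.
subroutine backsubst(n, u, btilde, x)
    implicit none
    integer, intent(in) :: n
    real(8), intent(in) :: u(n, n), btilde(n)
    real(8), intent(out) :: x(n)
    integer :: i, j
    real(8) :: s

    do i = n, 1, -1              ! solve for the last unknown first
        s = btilde(i)
        do j = i + 1, n          ! subtract the already-computed unknowns
            s = s - u(i, j) * x(j)
        end do
        x(i) = s / u(i, i)
    end do
end subroutine backsubst
```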

In practice, the matrix A is often fixed and we need to solve the system Ax = b for many different right-hand sides b. Therefore, if we save U and understand how to recover \tilde b from b, we need to perform the n^3 amount of work only once, and afterwards only about n^2 operations to find the solution of Ax = b whenever we change b.

We will see that \tilde b can be found as the solution of L\tilde b = b, where L is a lower triangular matrix. The algorithm that finds L and U from A is called LU decomposition.
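Assuming L is already known, recovering \tilde b amounts to forward substitution, which mirrors backsubstitution and likewise costs only about n^2 operations. The following is an illustrative sketch (the algorithm that actually produces L and U comes later):

```fortran
! Forward substitution for L y = b with a lower triangular L; a sketch of
! how btilde could be recovered once L is known.
subroutine forwardsubst(n, l, b, y)
    implicit none
    integer, intent(in) :: n
    real(8), intent(in) :: l(n, n), b(n)
    real(8), intent(out) :: y(n)
    integer :: i, j
    real(8) :: s

    do i = 1, n                  ! solve for the first unknown first
        s = b(i)
        do j = 1, i - 1          ! subtract the already-computed unknowns
            s = s - l(i, j) * y(j)
        end do
        y(i) = s / l(i, i)
    end do
end subroutine forwardsubst
```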